control robot
This AI-Powered Robot Keeps Going Even if You Attack It With a Chainsaw
A single AI model trained to control numerous robotic bodies can operate unfamiliar hardware and adapt eerily well to serious injuries. A four-legged robot that keeps crawling even after all four of its legs have been hacked off with a chainsaw is the stuff of nightmares for most people. For Deepak Pathak, cofounder and CEO of the startup Skild AI, the dystopian feat of adaptation is an encouraging sign of a new, more general kind of robotic intelligence. "This is something we call an omni-bodied brain," Pathak tells me. His startup developed the generalist artificial intelligence algorithm to address a key challenge with advancing robotics: "Any robot, any task, one brain."
- South America (0.05)
- North America > United States > California > San Francisco County > San Francisco (0.05)
- North America > Central America (0.05)
- Information Technology > Artificial Intelligence > Robots > Locomotion (0.51)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks > Deep Learning (0.48)
- Information Technology > Artificial Intelligence > Games > Go (0.41)
- Information Technology > Artificial Intelligence > Natural Language > Large Language Model (0.31)
New capsule device lets you control robots with your entire body
H2L, a Tokyo-based technology startup, has launched the Capsule Interface. This breakthrough device lets you control robots with your entire body, transmitting not just movement but also physical force. This technology is poised to transform how humans interact with robots and digital avatars, offering a new level of immersion and precision.
Thoughtful technology: We can now control robots - with our minds - Study Finds
Scientists from the University of Technology Sydney have developed new biosensor technology that actually makes mind reading possible! No, not like a fortune teller; this new technology allows people to operate devices, such as robots and machines, solely via thought control. You think, and the robot acts. Researchers add that this exciting breakthrough holds positive implications for the fields of healthcare, aerospace, and advanced manufacturing. This advanced brain-computer interface was developed by Distinguished Professor Chin-Teng Lin and Professor Francesca Iacopi, from the UTS Faculty of Engineering and IT, in collaboration with the Australian Army and Defence Innovation Hub.
- Health & Medicine (0.79)
- Government > Military > Army (0.40)
Microsoft trains ChatGPT to control robots
Robots still rely heavily on hand-written code to perform their tasks, while humans find spoken language the most intuitive way to communicate. Microsoft has worked to alter this reality and "make natural human-robot interactions possible using OpenAI's new AI language model, ChatGPT." The team plans to leverage the platform's ability to develop coherent and grammatically correct responses to various prompts and questions, and to see if ChatGPT can think beyond the text and reason about the physical world to help with robotics tasks. "We want to help people interact with robots more easily, without needing to learn complex programming languages or details about robotic systems." The key obstacle for an AI language model is solving problems while accounting for the laws of physics, the context of the operating environment, and how the robot's physical actions change the state of the world.
- Information Technology > Artificial Intelligence > Robots (1.00)
- Information Technology > Artificial Intelligence > Natural Language > Large Language Model (1.00)
- Information Technology > Artificial Intelligence > Natural Language > Chatbot (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks > Deep Learning (1.00)
ChatGPT and OpenAI: A Step Forward in Robot Control - Geek Metaverse
Microsoft and OpenAI have partnered up to explore the possibility of using the ChatGPT language model to control robots and drones using natural language. By doing so, the companies aim to simplify interactions between people and machines, without the need for complex programming languages. The goal is to enable ChatGPT to control robots, making it easier for people to interact with them, and improve communication through the implementation of artificial intelligence developed by OpenAI. Microsoft has released a paper that outlines a new set of design principles that utilizes ChatGPT to give instructions to robots. The process involves defining a list of high-level tasks that the robot can perform, writing an instruction that ChatGPT translates into the robot's language, and then running a simulation in which the robot follows the instructions.
- Information Technology > Artificial Intelligence > Natural Language > Large Language Model (1.00)
- Information Technology > Artificial Intelligence > Natural Language > Chatbot (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks > Deep Learning > Generative AI (0.86)
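The pipeline described above (define a fixed list of high-level tasks, let ChatGPT translate an instruction into calls against them, then run the result in simulation) can be sketched as follows. This is a minimal illustration, not Microsoft's actual interface: the function names, the `translate()` stub standing in for the ChatGPT call, and the whitelisted `exec` sandbox are all assumptions.

```python
# 1. Define the high-level tasks the robot can perform. Here they just
#    record what they were asked to do, standing in for real actuation.
log = []

def move_to(x, y):
    log.append(("move_to", x, y))

def grab(obj):
    log.append(("grab", obj))

ROBOT_API = {"move_to": move_to, "grab": grab}

# 2. Stand-in for the language model: in the real pipeline, ChatGPT is
#    prompted with a description of the API plus a natural-language
#    instruction and returns code that calls only those functions.
def translate(instruction):
    if "fetch" in instruction:
        return 'move_to(3, 4)\ngrab("cup")'
    return ""

# 3. Run the generated program in simulation before it touches hardware;
#    restricting the namespace to the whitelisted API limits what the
#    generated code can do.
program = translate("fetch the cup on the table")
exec(program, {"__builtins__": {}}, dict(ROBOT_API))
```

After the run, `log` holds the sequence of high-level calls the generated program made, which is exactly what a simulator would replay before deployment.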
Microsoft Researchers Are Using ChatGPT to Control Robots, Drones
ChatGPT is best known as an AI program capable of writing essays and answering questions, but now Microsoft is using the chatbot to control robots. On Monday, the company's researchers published a paper on how ChatGPT can streamline the process of programming software commands to control various robots, such as mechanical arms and drones. "We still rely heavily on hand-written code to control robots," the researchers wrote. Microsoft's approach, on the other hand, taps ChatGPT to write some of the computer code. ChatGPT can do this because the AI model was trained on huge libraries of human text, including the code for software programs.
- Transportation > Air (0.40)
- Information Technology > Robotics & Automation (0.40)
- Information Technology > Artificial Intelligence > Robots (1.00)
- Information Technology > Artificial Intelligence > Natural Language > Large Language Model (1.00)
- Information Technology > Artificial Intelligence > Natural Language > Chatbot (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks > Deep Learning (1.00)
The US Army is Building a Voice Assistant Named JUDI to Control Robots - Voicebot.ai
The United States Army is developing a conversational intelligence platform that will let soldiers give voice commands to robotic vehicles using natural language. Instead of requiring formal commands, the Joint Understanding and Dialogue Interface, JUDI, will be able to understand and interpret intent in its orders, clarifying them with questions as needed. The U.S. Army Combat Capabilities Development Command's Army Research Laboratory is building JUDI in a partnership with the University of Southern California's Institute for Creative Technologies. Their goal for JUDI is to combine an understanding of informal language with data from its sensors to grasp the context of its orders. In the robots currently used for testing (basically very advanced miniature cars), JUDI will theoretically be able to take a single command like "go to the top of the hill," combine camera data identifying a nearby hill with its natural language processing to work out its goal and how to achieve it, and ask follow-up questions of the operator as needed.
- North America > United States > California (0.56)
- Asia > Russia (0.18)
- Europe > Russia (0.07)
- North America > United States > New York (0.05)
- Government > Regional Government > North America Government > United States Government (1.00)
- Government > Military > Army (1.00)
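The grounding step the article describes, matching an informal order against landmarks the sensors have already detected and asking a clarifying question when the match is ambiguous, can be sketched roughly like this. The landmark table, coordinates, and matching logic are illustrative assumptions, not JUDI's actual design:

```python
# Hypothetical landmarks extracted from camera data: name -> position.
detected_landmarks = {"hill": (120, 45), "tree": (80, 10)}

def interpret(command, landmarks):
    """Ground an informal command against sensed landmarks."""
    matches = [name for name in landmarks if name in command.lower()]
    if len(matches) == 1:
        # One unambiguous landmark: issue a navigation goal.
        return ("GOTO", landmarks[matches[0]])
    if not matches:
        # Nothing in view matches: ask the operator for clarification.
        return ("ASK", "I don't see that. Can you describe where to go?")
    # Several candidates: ask which one was meant.
    return ("ASK", f"Did you mean the {' or the '.join(matches)}?")

print(interpret("Go to the top of the hill", detected_landmarks))
# → ('GOTO', (120, 45))
```

A single landmark match grounds the spoken order directly to camera coordinates; anything else falls back to dialogue, which is the behavior the article attributes to JUDI.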
Could Artificial Intelligence Lead to Communism? - BlockDelta
Marx argued that under capitalism, everyone must work to live. We have some freedom to choose what type of work we do. But few of us have the choice not to work at all. Most of us need to find some particular task(s) we can do in exchange for a wage. And we cannot just walk away if we do not like it.
MIT uses brain signals and hand gestures to control robots
Robotic technology has a staggering range of applications, but getting it to perform adequately can be a challenge, requiring specific programming based around the way humans communicate with language. But now, researchers from MIT have developed a way to control robots more intuitively, using hand gestures and brainwaves. The team harnessed the power of brain signals called "error-related potentials" (ErrPs), which naturally occur when people notice a mistake. The system monitors the brain activity of a person observing robotic work, and if an ErrP occurs -- because the robot has made an error -- the robot pauses its activity so the user can correct it. This happens via an interface that measures muscle activity -- the person makes hand gestures to select the correct option for the robot.
- Health & Medicine > Therapeutic Area > Neurology (0.95)
- Health & Medicine > Health Care Technology (0.63)
MIT finds an easy way to control robots with your brain
You'd have to wear an EEG cap for the technique to work, since CSAIL's system needs to be able to read and record your brain activity. The machine-learning algorithms it created then classifies brain waves within 10 to 30 milliseconds, focusing on detecting "error-related potentials" or ErrPs. These are signals your brain generates when you spot a mistake. If you disagree with a robot's decision to, say, place a can of paint in a basket marked "wire," the system picks up on the ErrPs in your thoughts to correct the machine's course of action. "As you watch the robot, all you have to do is mentally agree or disagree with what it is doing. You don't have to train yourself to think in a certain way -- the machine adapts to you, and not the other way around."
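The ErrP feedback loop described in the two excerpts above can be sketched in a few lines. The threshold classifier below is a deliberately crude placeholder for CSAIL's trained model, and the bin names and threshold value are assumptions for illustration:

```python
ERRP_THRESHOLD = 0.8  # assumed score above which a window counts as an ErrP

def detect_errp(eeg_window):
    # Real systems classify each short EEG window with a trained model;
    # a peak-amplitude threshold stands in for that classifier here.
    return max(eeg_window) > ERRP_THRESHOLD

def sort_object(obj, eeg_window):
    """Pick a bin for the object, then let the observer's brain veto it."""
    choice = "paint" if obj == "paint can" else "wire"
    if detect_errp(eeg_window):
        # An ErrP means the human observer flagged a mistake:
        # switch to the other bin, as in the paint-can example above.
        choice = "wire" if choice == "paint" else "paint"
    return choice
```

The point of the design, as the quote notes, is that the classifier adapts to the observer's natural reaction: the human never issues an explicit command, the system simply reverses course whenever an error signal appears.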